
 Cerebras Announces Six New AI Datacenters Across North America and Europe to Deliver Industry’s Largest Dedicated AI Inference Cloud

New datacenters catapult Cerebras to hyperscale capacity, offering over 40 million tokens/second to enterprises, governments, and developers worldwide.

SUNNYVALE, Calif. — Cerebras Systems, the pioneer in accelerating generative AI, today announced the launch of six new AI inference datacenters powered by Cerebras Wafer-Scale Engines. These state-of-the-art facilities, equipped with thousands of Cerebras CS-3 systems, are […]



Cerebras Systems is launching six new AI inference datacenters across North America and Europe, powered by Cerebras Wafer-Scale Engines. These facilities will deliver over 40 million Llama 70B tokens per second, making Cerebras the largest dedicated high-speed AI inference cloud provider. The expansion aims to increase aggregate capacity by 20x to meet rising customer demand.
